Information Geometry and Statistical Inference
Abstract
Variance and Fisher information are ingredients of the Cramér-Rao inequality. Fisher information is regarded as a Riemannian metric on a quantum statistical manifold, and we choose monotonicity under coarse graining as the fundamental property. The quadratic cost functions are in a dual relation with the Fisher information quantities, and they reduce to the variance in the commuting case. The scalar curvature in a certain geometry might be interpreted as an uncertainty on a statistical manifold. Information geometry has a surprising application to the theory of the geometric mean of matrices.

1. The Cramér-Rao Inequality

The Cramér-Rao inequality belongs to the basics of estimation theory in mathematical statistics. Its quantum analog was discovered immediately after the foundation of mathematical quantum estimation theory in the 1960's; see the book [10] of Helstrom, or the book [11] of Holevo, for a rigorous summary of the subject. Although both the classical Cramér-Rao inequality and its quantum analog are as trivial as the Schwarz inequality, the subject attracts a lot of attention because it is located on the highly exciting boundary of statistics, information and quantum theory.

As a starting point we give a very general form of the quantum Cramér-Rao inequality in the simple setting of finite dimensional quantum mechanics. For $\theta \in (-\varepsilon, \varepsilon) \subset \mathbb{R}$ a statistical operator $\rho(\theta)$ is given, and the aim is to estimate the value of the parameter $\theta$ close to $0$. Formally, $\rho(\theta)$ is an $n \times n$ positive semidefinite matrix of trace $1$ which describes a mixed state of a quantum mechanical system, and we assume that $\rho(\theta)$ is smooth (in $\theta$). Assume that the estimation is performed by the measurement of a self-adjoint matrix $A$ playing the role of an observable. $A$ is called a locally unbiased estimator if

(1)  $\left.\dfrac{\partial}{\partial\theta}\,\mathrm{Tr}\,\rho(\theta)A\right|_{\theta=0} = 1.$

This condition holds if $A$ is an unbiased estimator for $\theta$, that is,

(2)  $\mathrm{Tr}\,\rho(\theta)A = \theta \qquad (\theta \in (-\varepsilon, \varepsilon)).$

Supported by the Hungarian grant OTKA T032662.
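To give conditions (1)–(2) a concrete numerical reading, the sketch below (not part of the original paper) checks one well-known special case of the quantum Cramér-Rao inequality, the bound $\mathrm{Var}_\rho(A) \ge 1/F(\rho)$ built on the symmetric logarithmic derivative, which is one of the Fisher information quantities alluded to in the abstract. The one-parameter qubit family $\rho(\theta) = (I + \theta\sigma_z)/2$ and the locally unbiased estimator $A$ are arbitrary illustrative choices, not taken from the paper.

```python
import numpy as np

# Pauli matrices
I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def rho(theta):
    """Illustrative one-parameter family of qubit states: rho(theta) = (I + theta*sigma_z)/2."""
    return 0.5 * (I2 + theta * sz)

def sld(rho0, drho):
    """Symmetric logarithmic derivative L solving drho = (L rho0 + rho0 L)/2.
    In the eigenbasis of rho0: L_ij = 2 (drho)_ij / (lambda_i + lambda_j)."""
    lam, U = np.linalg.eigh(rho0)
    d = U.conj().T @ drho @ U
    L = np.zeros_like(d)
    for i in range(len(lam)):
        for j in range(len(lam)):
            s = lam[i] + lam[j]
            if s > 1e-12:
                L[i, j] = 2 * d[i, j] / s
    return U @ L @ U.conj().T

theta0, eps = 0.0, 1e-6
rho0 = rho(theta0)
drho = (rho(theta0 + eps) - rho(theta0 - eps)) / (2 * eps)   # numerical derivative of rho(theta)

L = sld(rho0, drho)
F = np.trace(rho0 @ L @ L).real                               # SLD Fisher information

# A locally unbiased estimator at theta0: condition (1), Tr[drho A] = 1.
A = sz + 0.7 * sx
assert abs(np.trace(drho @ A).real - 1) < 1e-6

var_A = (np.trace(rho0 @ A @ A) - np.trace(rho0 @ A) ** 2).real
print(f"Var(A) = {var_A:.4f} >= 1/F = {1 / F:.4f}")           # quantum Cramer-Rao bound
```

With these illustrative choices the bound is strict, Var(A) = 1.49 against 1/F = 1.00; picking A = σ_z instead (also locally unbiased here) saturates it.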